ABSTRACT
In recent years, several machine learning models have been successfully deployed in various fields. However, a large quantity of data is required to train a good machine learning model. Data are distributed across multiple sources, and centralizing those data leads to privacy and security issues. To solve this problem, the proposed federated method works by exchanging the parameters of three locally trained machine learning models without compromising privacy. Each machine learning model uses the e-adoption of CT scans to improve its training knowledge. The CT scans are electronically transferred between various medical centers, and proper care is taken to prevent identity loss from the e-adopted data. To normalize the parameters, a novel weighting scheme is also exchanged along with the parameters. Thus, the global model is trained with more heterogeneous samples, which increases performance. In the experiments, the proposed algorithm obtained an accuracy of 89%, which is 32% higher than that of the existing machine learning models.